
    INFRISK : a computer simulation approach to risk management in infrastructure project finance transactions

    Few issues in modern finance have inspired the interest of both practitioners and theoreticians more than risk evaluation and management. The basic principle governing risk management in an infrastructure project finance deal is intuitive and well-articulated: allocate project-specific risks to the parties best able to bear them (taking into account each party's appetite for, and aversion to, risk); control performance risk through incentives; and use market hedging instruments (derivatives) to cover marketwide risks arising from fluctuations in, for instance, interest and exchange rates. In practice, however, governments have been asked to provide guarantees for various kinds of projects, often at no charge, because of problems associated with market imperfections: a) derivative markets (swaps, forwards) for currency and interest-rate risk hedging either do not exist or are inadequately developed in most developing countries; b) contracting possibilities are limited (because of problems with credibility of enforcement); and c) methods for risk measurement and evaluation differ. Two factors distinguish the financing of infrastructure projects from corporate and traditional limited-recourse project finance: 1) a high concentration of project risk early in the project life cycle (pre-completion), and 2) a risk profile that changes as the project comes to fruition, with a relatively stable cash flow subject to market and regulatory risk once the project is completed. The authors introduce INFRISK, a computer-based risk-management approach to infrastructure project transactions that involve the private sector. Developed in-house in the Economic Development Institute of the World Bank, INFRISK is a guide to practitioners in the field and a training tool for raising awareness and improving expertise in the application of modern risk management techniques. INFRISK can analyze a project's exposure to a variety of market, credit, and performance risks from the perspective of the key contracting parties (project promoter, creditor, and government). Their model is driven by the concept of the project's economic viability. Drawing on recent developments in the literature on project evaluation under uncertainty, INFRISK generates probability distributions for key decision variables, such as a project's net present value, internal rate of return, or capacity to service its debt on time during the life of the project. Computationally, INFRISK works in conjunction with Microsoft Excel and supports both the construction and the operation phases of a capital investment project. For a particular risk variable of interest (such as the revenue stream, operations and maintenance costs, or construction costs) the program first generates a stream of probability distributions for each year of a project's life through Monte Carlo simulation. One of the key contributions made by INFRISK is to enable the use of a broader set of probability distributions (uniform, normal, beta, and lognormal) in conducting Monte Carlo simulations, rather than relying only on the commonly used normal distribution. A user's guide provides instruction on the use of the package.
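
    A short, self-contained sketch can illustrate the Monte Carlo step described above. This is not the INFRISK package itself (an Excel-based tool whose internals are not reproduced here); it is a minimal Python illustration of the same idea, using hypothetical project parameters, in which annual revenues and costs are drawn from user-chosen distributions to produce a probability distribution for the project's net present value.

```python
# Minimal sketch of INFRISK-style Monte Carlo project evaluation.
# All figures (project life, discount rate, cost and revenue assumptions)
# are hypothetical placeholders, not values from the INFRISK documentation.
import numpy as np

rng = np.random.default_rng(42)
N_SIM = 10_000          # number of Monte Carlo draws
YEARS = 15              # project life: 3 years construction + 12 years operation
DISCOUNT_RATE = 0.10    # hypothetical cost of capital

def draw(dist, size, **p):
    """Sample annual values from one of the distribution families INFRISK supports."""
    if dist == "uniform":
        return rng.uniform(p["low"], p["high"], size)
    if dist == "normal":
        return rng.normal(p["mean"], p["sd"], size)
    if dist == "lognormal":
        return rng.lognormal(p["mu"], p["sigma"], size)
    if dist == "beta":  # rescaled to an arbitrary [low, high] interval
        return p["low"] + (p["high"] - p["low"]) * rng.beta(p["a"], p["b"], size)
    raise ValueError(f"unknown distribution: {dist}")

discount = (1.0 + DISCOUNT_RATE) ** np.arange(YEARS)
npvs = np.empty(N_SIM)
for i in range(N_SIM):
    # Construction phase (years 0-2): uncertain capital costs, no revenue.
    capex = draw("lognormal", 3, mu=np.log(80.0), sigma=0.15)      # $m per year
    # Operation phase (years 3-14): uncertain revenues and O&M costs.
    revenue = draw("normal", YEARS - 3, mean=60.0, sd=10.0)
    om_cost = draw("uniform", YEARS - 3, low=12.0, high=18.0)
    cash_flow = np.concatenate([-capex, revenue - om_cost])
    npvs[i] = np.sum(cash_flow / discount)

print(f"Mean NPV: {npvs.mean():.1f}  P(NPV < 0): {(npvs < 0).mean():.2%}")
print(f"5th-95th percentile NPV: {np.percentile(npvs, [5, 95]).round(1)}")
```

    The same sampled cash flows could equally feed a debt-service-coverage calculation, which is how a distribution for the project's capacity to service its debt on time would be obtained.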

    A dusty pinwheel nebula around the massive star WR 104

    Wolf-Rayet (WR) stars are luminous massive blue stars thought to be immediate precursors to the supernovae terminating their brief lives. The existence of dust shells around such stars has been enigmatic since their discovery some 30 years ago; the intense radiation field from the star should be inimical to dust survival. Although dust-creation models, including those involving interacting stellar winds from a companion star, have been put forward, high-resolution observations are required to understand this phenomenon. Here we present resolved images of the dust outflow around the Wolf-Rayet star WR 104, obtained with novel imaging techniques, revealing detail on scales corresponding to about 40 AU at the star. Our maps show that the dust forms a spatially confined stream following precisely a linear (or Archimedean) spiral trajectory. Images taken at two separate epochs show a clear rotation with a period of 220 +/- 30 days. Taken together, these findings prove that a binary star is responsible for the creation of the circumstellar dust, while the spiral plume makes WR 104 the prototype of a new class of circumstellar nebulae unique to interacting wind systems. Comment: 7 pages, 2 figures; appearing in Nature (1999 April 8).
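
    The "linear (or Archimedean) spiral" geometry can be pictured with a small toy calculation: dust launched at a roughly constant wind speed from a dust-formation point that rotates with the 220-day binary period traces a curve whose radius grows linearly with azimuth. The wind speed below is a hypothetical round number, not a value taken from the paper.

```python
# Toy geometry sketch (not code from the paper): dust streaming outward at constant
# wind speed from a formation point that rotates with the 220-day binary period
# traces an Archimedean spiral, r proportional to theta.
import numpy as np

PERIOD_DAYS = 220.0              # rotation period reported from two-epoch imaging
WIND_KM_S = 1200.0               # assumed (hypothetical) outflow speed in km/s
AU_KM = 1.495978707e8
SEC_PER_DAY = 86400.0

# Age of each dust parcel, sampled over two full turns of the spiral.
t_days = np.linspace(0.0, 2.0 * PERIOD_DAYS, 500)
theta = 2.0 * np.pi * t_days / PERIOD_DAYS            # launch azimuth unwinds linearly in time
r_au = WIND_KM_S * t_days * SEC_PER_DAY / AU_KM       # radius grows linearly with age

# Sky-projected coordinates of the plume, assuming a face-on view.
x_au, y_au = r_au * np.cos(theta), r_au * np.sin(theta)

# For a linear spiral, r/theta is constant, so successive turns are evenly spaced.
turn_spacing_au = r_au[-1] / (theta[-1] / (2.0 * np.pi))
print(f"Radial spacing between successive turns: {turn_spacing_au:.0f} AU")
```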

    Returning home: heritage work among the Stl'atl'imx of the Lower Lillooet River Valley

    This article focusses on heritage practices in the tensioned landscape of the Stl’atl’imx (pronounced Stat-lee-um) people of the Lower Lillooet River Valley, British Columbia, Canada. Displaced from their traditional territories and cultural traditions through the colonial encounter, they are enacting, challenging and remaking their heritage as part of their long-term goal to reclaim their land and return ‘home’. I draw on three examples of their heritage work: graveyard cleaning, the shifting ‘official’/‘unofficial’ heritage of a wagon road, and the marshalling of the mountain named Nsvq’ts (pronounced In-SHUCK-ch) in order to illustrate how the past is strategically mobilised to substantiate positions in the present. While this paper focusses on heritage in an Indigenous and postcolonial context, I contend that the dynamics of heritage practices outlined here are applicable to all heritage practices.

    Comparison of Pittsburgh compound B and florbetapir in cross-sectional and longitudinal studies.

    INTRODUCTION: Quantitative in vivo measurement of brain amyloid burden is important for both research and clinical purposes. However, the existence of multiple imaging tracers presents challenges to the interpretation of such measurements. This study presents a direct comparison of Pittsburgh compound B-based and florbetapir-based amyloid imaging in the same participants from two independent cohorts using a crossover design. METHODS: Pittsburgh compound B and florbetapir amyloid PET imaging data from three different cohorts were analyzed using previously established pipelines to obtain global amyloid burden measurements. These measurements were converted to the Centiloid scale to allow fair comparison between the two tracers. The mean and inter-individual variability of the two tracers were compared using multivariate linear models, both cross-sectionally and longitudinally. RESULTS: Global amyloid burden measurements obtained with the two tracers were strongly correlated in both cohorts. However, higher variability was observed when florbetapir was used as the imaging tracer. The variability may be partially caused by white matter signal, as partial volume correction reduces the variability and improves the correlations between the two tracers. Amyloid burden measured using both tracers was found to be associated with clinical and psychometric measurements. Longitudinal comparison of the two tracers was also performed in similar but separate cohorts whose baseline amyloid load was considered elevated (i.e., amyloid positive). No significant difference was detected in the average annualized rate of change measured with these two tracers. DISCUSSION: Although the amyloid burden measurements were quite similar using these two tracers, as expected, differences were observable even after conversion to the Centiloid scale. Further investigation is warranted to identify optimal strategies to harmonize amyloid imaging data acquired using different tracers.
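
    As a rough illustration of the Centiloid conversion used in the methods, the sketch below applies the standard linear rescaling of a tracer-specific global SUVR onto a common 0-100 scale. The calibration constants are placeholders, not the values used in this study.

```python
# Rough illustration of the Centiloid idea: each tracer's global SUVR is put on a
# common scale by a linear, tracer-specific calibration in which 0 corresponds to
# young controls and 100 to a typical AD-level amyloid burden. The slope and
# intercept values below are placeholders, not the constants used in this study.
def suvr_to_centiloid(suvr, slope, intercept):
    """Linear conversion CL = slope * SUVR + intercept (tracer-specific calibration)."""
    return slope * suvr + intercept

PIB_CAL = dict(slope=100.0, intercept=-105.0)   # hypothetical PiB calibration
FBP_CAL = dict(slope=175.0, intercept=-182.0)   # hypothetical florbetapir calibration

pib_cl = suvr_to_centiloid(1.45, **PIB_CAL)     # e.g. a PiB global SUVR of 1.45
fbp_cl = suvr_to_centiloid(1.20, **FBP_CAL)     # e.g. a florbetapir global SUVR of 1.20
print(f"PiB: {pib_cl:.0f} CL   Florbetapir: {fbp_cl:.0f} CL")
```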

    The Clinical Promise of Biomarkers of Synapse Damage or Loss in Alzheimer’s Disease

    BACKGROUND: Synapse damage and loss are fundamental to the pathophysiology of Alzheimer's disease (AD) and lead to reduced cognitive function. The goal of this review is to address the challenges of forging new clinical development approaches for AD therapeutics that can demonstrate reduction of synapse damage or loss. The key points of this review include the following: Synapse loss is a downstream effect of amyloidosis, tauopathy, inflammation, and other mechanisms occurring in AD. Synapse loss correlates most strongly with cognitive decline in AD because synaptic function underlies cognitive performance. Compounds that halt or reduce synapse damage or loss have a strong rationale as treatments of AD. Biomarkers that measure synapse degeneration or loss in patients will facilitate clinical development of such drugs. The ability of methods to sensitively measure synapse density in the brain of a living patient, through synaptic vesicle glycoprotein 2A (SV2A) positron emission tomography (PET) imaging, concentrations of synaptic proteins (e.g., neurogranin or synaptotagmin) in the cerebrospinal fluid (CSF), or functional imaging techniques such as quantitative electroencephalography (qEEG), provides a compelling case to use these types of measurements as biomarkers that quantify synapse damage or loss in clinical trials in AD. CONCLUSION: A number of emerging biomarkers are able to measure synapse injury and loss in the brain and may correlate with cognitive function in AD. These biomarkers hold promise both for use in diagnostics and in the measurement of therapeutic successes.

    Range shifts or extinction? Ancient DNA and distribution modelling reveal past and future responses to climate warming in cold-adapted birds.

    Global warming is predicted to cause substantial habitat rearrangements, with the most severe effects expected to occur in high-latitude biomes. However, one major uncertainty is whether species will be able to shift their ranges to keep pace with climate-driven environmental changes. Many recent studies on mammals have shown that past range contractions have been associated with local extinctions rather than survival by habitat tracking. Here, we have used an interdisciplinary approach that combines ancient DNA techniques, coalescent simulations and species distribution modelling, to investigate how two common cold-adapted bird species, willow and rock ptarmigan (Lagopus lagopus and Lagopus muta), respond to long-term climate warming. Contrary to previous findings in mammals, we demonstrate a genetic continuity in Europe over the last 20 millennia. Results from back-casted species distribution models suggest that this continuity may have been facilitated by uninterrupted habitat availability and potentially also the greater dispersal ability of birds. However, our predictions show that in the near future, some isolated regions will have little suitable habitat left, implying a future decrease in local populations at a scale unprecedented since the last glacial maximum

    A clinical and economic evaluation of Control of Hyperglycaemia in Paediatric intensive care (CHiP): a randomised controlled trial.

    BACKGROUND: Early research in adults admitted to intensive care suggested that tight control of blood glucose during acute illness can be associated with reductions in mortality, length of hospital stay and complications such as infection and renal failure. Prior to our study, it was unclear whether or not children could also benefit from tight control of blood glucose during critical illness. OBJECTIVES: This study aimed to determine if controlling blood glucose using insulin in paediatric intensive care units (PICUs) reduces mortality and morbidity and is cost-effective, whether or not admission follows cardiac surgery. DESIGN: Randomised open two-arm parallel group superiority design with central randomisation using minimisation. Analysis was on an intention-to-treat basis. Following random allocation, caregivers and outcome assessors were no longer blind to allocation. SETTING: The setting was 13 English PICUs. PARTICIPANTS: Patients who met the following criteria were eligible for inclusion: ≥ 36 weeks corrected gestational age; ≤ 16 years; in the PICU following injury, following major surgery or with critical illness; anticipated treatment > 12 hours; arterial line; mechanical ventilation; and vasoactive drugs. Exclusion criteria were as follows: diabetes mellitus; inborn error of metabolism; treatment withdrawal considered; in the PICU > 5 consecutive days; and already in CHiP (Control of Hyperglycaemia in Paediatric intensive care). INTERVENTION: The intervention was tight glycaemic control (TGC): insulin by intravenous infusion titrated to maintain blood glucose between 4.0 and 7.0 mmol/l. CONVENTIONAL MANAGEMENT (CM): This consisted of insulin by intravenous infusion only if blood glucose exceeded 12.0 mmol/l on two samples at least 30 minutes apart; insulin was stopped when blood glucose fell below 10.0 mmol/l. MAIN OUTCOME MEASURES: The primary outcome was the number of days alive and free from mechanical ventilation within 30 days of trial entry (VFD-30). The secondary outcomes comprised clinical and economic outcomes at 30 days and 12 months and lifetime cost-effectiveness, which included costs per quality-adjusted life-year. RESULTS: CHiP recruited from May 2008 to September 2011. In total, 19,924 children were screened and 1369 eligible patients were randomised (TGC, 694; CM, 675), 60% of whom were in the cardiac surgery stratum. The randomised groups were comparable at trial entry. More children in the TGC than in the CM arm received insulin (66% vs. 16%). The mean VFD-30 was 23 [mean difference 0.36; 95% confidence interval (CI) -0.42 to 1.14]. The effect did not differ among prespecified subgroups. Hypoglycaemia occurred significantly more often in the TGC than in the CM arm (moderate, 12.5% vs. 3.1%; severe, 7.3% vs. 1.5%). Mean 30-day costs were similar between arms, but mean 12-month costs were lower in the TGC than in the CM arm (incremental costs -£3620, 95% CI -£7743 to £502). For the non-cardiac surgery stratum, mean costs were lower in the TGC than in the CM arm (incremental cost -£9865, 95% CI -£18,558 to -£1172), but, in the cardiac surgery stratum, the costs were similar between the arms (incremental cost £133, 95% CI -£3568 to £3833). Lifetime incremental net benefits were positive overall (£3346, 95% CI -£11,203 to £17,894), but close to zero for the cardiac surgery stratum (-£919, 95% CI -£16,661 to £14,823). For the non-cardiac surgery stratum, the incremental net benefits were high (£11,322, 95% CI -£15,791 to £38,615).
The probability that TGC is cost-effective is relatively high for the non-cardiac surgery stratum, but, for the cardiac surgery subgroup, the probability that TGC is cost-effective is around 0.5. Sensitivity analyses showed that the results were robust to a range of alternative assumptions. CONCLUSIONS: CHiP found no differences in the clinical or cost-effectiveness of TGC compared with CM overall, or for prespecified subgroups. A higher proportion of the TGC arm had hypoglycaemia. This study did not provide any evidence to suggest that PICUs should stop providing CM for children admitted to PICUs following cardiac surgery. For the subgroup not admitted for cardiac surgery, TGC reduced average costs at 12 months and is likely to be cost-effective. Further research is required to refine the TGC protocol to minimise the risk of hypoglycaemic episodes and assess the long-term health benefits of TGC. TRIAL REGISTRATION: Current Controlled Trials ISRCTN61735247. FUNDING: This project was funded by the NIHR Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 18, No. 26. See the NIHR Journals Library website for further project information
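
    The lifetime incremental net benefit figures quoted above follow the standard net monetary benefit calculation; the sketch below shows the arithmetic with placeholder inputs rather than the CHiP estimates.

```python
# Incremental net monetary benefit of TGC versus CM:
#   INB = lambda * delta_QALYs - delta_cost,
# where lambda is the willingness-to-pay threshold per QALY.
# All numbers below are placeholders, not CHiP trial estimates.
def incremental_net_benefit(delta_qalys, delta_cost, wtp_per_qaly=20_000.0):
    """Return the incremental net monetary benefit (in pounds) of TGC vs. CM."""
    return wtp_per_qaly * delta_qalys - delta_cost

# Illustrative inputs: TGC adds 0.05 QALYs and saves 2,500 pounds per child.
inb = incremental_net_benefit(delta_qalys=0.05, delta_cost=-2_500.0)
print(f"Incremental net benefit: {inb:,.0f} GBP")  # positive => cost-effective at this threshold
```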

    European achievements in soil remediation and brownfield redevelopment

    With the aim of sharing best practices in soil restoration and the management of contaminated sites among European countries, and to raise awareness of the enormous efforts made to succeed in such a difficult commitment, the experts of the EIONET Soil working group on contaminated sites and brownfields agreed to gather their countries' interesting cases and successful stories of recovery of contaminated areas. This second edition of the monograph presents seventeen new cases from eight European countries and their regions, showing how polluted sites and brownfields have been remediated through new methodologies for sustainable restoration of the subsoil, the development of innovative technologies, funding mechanisms, and other approaches. These stories have been compiled to present what national, regional or local governments are doing to improve the quality of the environment and the living conditions of their population. A second aim is the promotion of best practices among industry, consultancies and business operators.